Saving a Keras Model to Google Cloud Storage

by CM


Posted on May 09, 2020



The Goal:

In this article, we will quickly explore how to save a Keras model to Google Cloud Storage. This is especially useful if you want to deploy your model and make it accessible online.

Google Cloud Storage:

Google Cloud Storage is a RESTful online file storage web service for storing and accessing data on Google Cloud Platform infrastructure. The service combines the performance and scalability of Google's cloud with advanced security and sharing capabilities. It is an Infrastructure as a Service (IaaS) offering, comparable to Amazon's S3 online storage service. Unlike Google Drive, which is aimed at individual users, Google Cloud Storage is geared toward enterprise workloads.


Creating a Bucket:

Let's jump right into creating a Cloud Storage bucket. We need one because data in GCS is stored in so-called buckets, which can be seen as containers that hold the data. Buckets can be created in various ways: for example, (1) in the Cloud Console web UI, (2) programmatically from Python (e.g. in Google Colab), or (3) with the gsutil tool from the Cloud SDK.

# As an example, I created a new bucket with the Cloud Console.
# Note: This step is not done in Python but in the Cloud Console or the Cloud SDK.

BUCKET_NAME = 'my_new_bucket'
PROJECT_NAME = 'project-12345'
MODEL_NAME = 'my_model'
MODEL_VERSION = 'v1.0'
PYTHON_VERSION = '3.7'
REGION = 'us-central1'
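
If you prefer to stay in Python, the bucket can also be created programmatically. Below is a minimal sketch using the google-cloud-storage client library; note that this package is an assumption on my part (install it with pip install google-cloud-storage) and is not required for the rest of this article.

# A minimal sketch: creating the bucket from Python instead.
# Assumes the google-cloud-storage package is installed and you are authenticated.
from google.cloud import storage

client = storage.Client(project=PROJECT_NAME)
bucket = client.create_bucket(BUCKET_NAME, location=REGION)
print('Created bucket: ' + bucket.name)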

Second, we jump into the Python code and start by importing our dependencies.

# For loading and saving the model
import tensorflow as tf

# For authentication in GCS (when working in Google Colab)
from google.colab import auth as google_auth

Luckily, we created an h5 model in the last article, which we will now upload to Google Cloud Storage. To prepare for that, we need to do two things: A) authenticate with GCS, and B) read the model from our local disk or the Colab file explorer into Python.

# Authentication
# Paste the provided authentication code into the respective web input field.
google_auth.authenticate_user()

# Reading the model from local disk / the Colab file explorer.
my_model = tf.keras.models.load_model('my_model.h5')
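
As an optional sanity check, you can print the model's architecture to confirm it loaded correctly:

# Optional: confirm the model loaded correctly.
my_model.summary()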

The last step is uploading the model to our new GCS bucket.

BUCKET_NAME = 'my_new_bucket'
FOLDER_NAME = 'my_new_folder'

# Writing to a gs:// path exports the model directly into the bucket.
GS_PATH = 'gs://' + BUCKET_NAME + '/' + FOLDER_NAME
tf.saved_model.save(my_model, GS_PATH)
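
Note that tf.saved_model.save exports the model in TensorFlow's SavedModel format rather than HDF5, and it returns None, so there is no export path to capture. Because TensorFlow's file I/O layer understands gs:// paths natively, no separate upload step is needed once you are authenticated.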

Done. We have now successfully saved our model to GCS. You can make it publicly available by changing the access permission rights in GCS.
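
As a hedged sketch of that last point, here is how you might verify the upload and make the individual objects public with the google-cloud-storage client (again an assumed dependency; if the bucket uses uniform bucket-level access, per-object ACLs are disabled and you would change permissions in the Cloud Console instead):

# Listing the uploaded files and making them publicly readable.
# Assumes google-cloud-storage is installed and the bucket uses fine-grained ACLs.
from google.cloud import storage

client = storage.Client(project=PROJECT_NAME)
for blob in client.list_blobs(BUCKET_NAME, prefix=FOLDER_NAME):
    blob.make_public()
    print(blob.name + ' -> ' + blob.public_url)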

Make ML Public! #Google Cloud Storage

#EpicML

